Moment convergence of regularized least-squares estimator for linear regression model
Authors
Abstract
Similar articles
Moment convergence of regularized least-squares estimator for linear regression model
In this paper we study the uniform tail-probability estimates of a regularized least-squares estimator for the linear regression model, by making use of the polynomial type large deviation inequality for the associated statistical random fields, which may not be locally asymptotically quadratic. Our results provide a measure of rate of consistency in variable selection in sparse estimation, whic...
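For a concrete reference point, here is a minimal sketch (not taken from the paper) of an ℓ1-regularized least-squares estimator for a sparse linear regression model, fitted by cyclic coordinate descent with soft-thresholding; the exact zeros produced by the penalty are the variable-selection behaviour whose rate of consistency the paper quantifies. The toy data, penalty level, and function names are illustrative assumptions.

```python
# Illustrative sketch only: l1-penalized least squares (LASSO) for a sparse
# linear model, solved by cyclic coordinate descent with soft-thresholding.
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of t * |.|."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Minimise (1/(2n))||y - X b||^2 + lam * ||b||_1 by coordinate descent."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n          # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ b + X[:, j] * b[j]   # partial residual excluding j
            rho = X[:, j] @ r_j / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b

# Toy data (hypothetical): only the first 3 of 20 coefficients are non-zero.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 20))
beta_true = np.zeros(20)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.1 * rng.standard_normal(200)
beta_hat = lasso_cd(X, y, lam=0.1)
print(np.round(beta_hat, 2))   # inactive coordinates are shrunk exactly to 0
```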
Optimal Rates for Regularized Least Squares Regression
We establish a new oracle inequality for kernel-based, regularized least squares regression methods, which uses the eigenvalues of the associated integral operator as a complexity measure. We then use this oracle inequality to derive learning rates for these methods. Here, it turns out that these rates are independent of the exponent of the regularization term. Finally, we show that our learning...
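As a rough empirical companion (my own illustration, not the paper's analysis): the integral operator associated with the kernel is approximated by the scaled Gram matrix K/n, whose eigenvalue decay is the complexity measure that enters such oracle inequalities, and the estimator concerned is kernel ridge regression. The RBF kernel, bandwidth, and data below are arbitrary choices.

```python
# Illustrative sketch only: empirical spectrum of the kernel integral operator
# (via the scaled Gram matrix) plus a kernel ridge regression fit.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(300, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(300)

K = rbf_kernel(X, X, gamma=5.0)              # Gram matrix of the RBF kernel
eigvals = np.linalg.eigvalsh(K / len(X))     # empirical integral-operator spectrum
print("leading eigenvalues:", np.round(eigvals[::-1][:5], 4))

model = KernelRidge(alpha=1e-3, kernel="rbf", gamma=5.0).fit(X, y)
print("training MSE:", np.mean((model.predict(X) - y) ** 2))
```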
Stability Analysis for Regularized Least Squares Regression
We discuss stability for a class of learning algorithms with respect to noisy labels. The algorithms we consider are for regression, and they involve the minimization of regularized risk functionals, such as $L(f) := \frac{1}{N}\sum_{i=1}^{N} (f(x_i) - y_i)^2 + \lambda \|f\|_H^2$. We shall call the algorithm 'stable' if, when $y_i$ is a noisy version of $f^*(x_i)$ for some function $f^* \in H$, the output of the algorithm converges to $f^*$ as the ...
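A minimal sketch of the estimator this stability analysis concerns, assuming the usual reading of the risk functional above: by the representer theorem its minimiser over the RKHS H is f(x) = Σ_i c_i k(x, x_i) with c = (K + Nλ I)^{-1} y. The Gaussian kernel, bandwidth, and toy data are assumptions made here for illustration.

```python
# Illustrative sketch only: regularized least squares in an RKHS via the
# representer theorem, c = (K + N*lam*I)^{-1} y, f(x) = sum_i c_i k(x, x_i).
import numpy as np

def gaussian_kernel(A, B, bandwidth=0.1):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * bandwidth ** 2))

def fit_rls(X, y, lam):
    K = gaussian_kernel(X, X)
    c = np.linalg.solve(K + len(X) * lam * np.eye(len(X)), y)
    return lambda X_new: gaussian_kernel(X_new, X) @ c

# Toy data (hypothetical): noisy labels y_i around a smooth target function.
rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(100, 1))
target = lambda x: np.cos(4 * np.pi * x[:, 0])
y = target(X) + 0.1 * rng.standard_normal(100)
f_hat = fit_rls(X, y, lam=1e-3)
print("mean squared fit error:", np.mean((f_hat(X) - target(X)) ** 2))
```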
Convergence of Common Proximal Methods for L1-Regularized Least Squares
We compare the convergence behavior of ADMM (alternating direction method of multipliers), [F]ISTA ([fast] iterative shrinkage and thresholding algorithm) and CD (coordinate descent) methods on the model ℓ1-regularized least squares problem (aka LASSO). We use an eigenanalysis of the operators to compare their local convergence rates when close to the solution. We find that, when applicable, CD...
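To make the comparison concrete, here is a toy sketch (not the eigenanalysis of the paper) of plain ISTA and FISTA iterations for the LASSO objective (1/2)‖y − Xb‖² + λ‖b‖₁, with step size 1/L where L is the largest eigenvalue of XᵀX; coordinate descent would instead apply the same soft-thresholding update one coordinate at a time. Problem sizes and the penalty level are arbitrary.

```python
# Illustrative sketch only: ISTA and FISTA proximal-gradient iterations for
# the LASSO objective (1/2)||y - X b||^2 + lam * ||b||_1.
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(X, y, lam, n_iter=500):
    L = np.linalg.eigvalsh(X.T @ X)[-1]            # Lipschitz constant of the gradient
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        b = soft_threshold(b - X.T @ (X @ b - y) / L, lam / L)
    return b

def fista(X, y, lam, n_iter=500):
    L = np.linalg.eigvalsh(X.T @ X)[-1]
    b = z = np.zeros(X.shape[1])
    t = 1.0
    for _ in range(n_iter):
        b_new = soft_threshold(z - X.T @ (X @ z - y) / L, lam / L)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        z = b_new + (t - 1) / t_new * (b_new - b)  # Nesterov momentum step
        b, t = b_new, t_new
    return b

# Toy data (hypothetical): 5 active coefficients out of 30.
rng = np.random.default_rng(3)
X = rng.standard_normal((100, 30))
beta = np.zeros(30)
beta[:5] = rng.standard_normal(5)
y = X @ beta + 0.05 * rng.standard_normal(100)
print("ISTA  nonzeros:", np.count_nonzero(ista(X, y, lam=5.0)))
print("FISTA nonzeros:", np.count_nonzero(fista(X, y, lam=5.0)))
```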
Journal
Journal title: Annals of the Institute of Statistical Mathematics
Year: 2016
ISSN: 0020-3157, 1572-9052
DOI: 10.1007/s10463-016-0577-6